# 8 billion parameters

## Aya 23 8B

Aya-23 is an open-weight research release of an instruction-fine-tuned model with highly advanced multilingual capabilities, covering 23 languages.

Large language model · Transformers · Supports multiple languages

CohereLabs · 10.28k · 415
## Granite 8b Qiskit

License: Apache-2.0

granite-8b-qiskit is an 8-billion-parameter model focused on generating high-quality Qiskit quantum-computing code.

Large language model · Transformers · Other

Qiskit · 1,676 · 8
## Llama 3 Typhoon V1.5 8b

Typhoon-8B is an 8-billion-parameter Thai large language model built on Llama3-8B, focused on text generation in Thai and English.

Large language model · Transformers

scb10x · 4,400 · 9
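
All three models above are tagged as Transformers-compatible, so they can be loaded with the standard Hugging Face pipeline. The sketch below is a minimal, hedged example: the repository ID is an assumption based on the listing (verify the exact name on the model's page), and the precision/device settings are just common defaults for running an 8B-parameter model on a single GPU.

```python
# Minimal sketch: loading one of the 8B models listed above with transformers.
# The repository ID below is an assumption taken from the listing; substitute
# the exact ID shown on the model card you intend to use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereLabs/aya-23-8B"  # assumed repository name; verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps 8B weights around ~16 GB
    device_map="auto",          # spread layers across available GPU(s)/CPU
)

prompt = "Translate to French: Hello, world!"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```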